C and Cplusplus

Posted by Sam on Mar 30, 2008 at 12:00 AM UTC - 5 hrs
Let me make a request for help and a quick announcement, and then I'll get you back to your regularly scheduled on-topic reading:

I need a good C/C++ IDE
I've been doing a lot of work in C++ lately for bioinformatics, and DevC++ is just not going to make the cut. My friend Michael suggested I use Visual Studio, but I thought I'd throw this out there and see what everyone else thought and try out a few more.

I'd like it to work on Windows, but I wouldn't mind hearing some Mac choices for the fun of it. Ideally, it would have a lot of the features of IntelliJ IDEA, but if it's not that awesome, I could probably get by. DevC++ is just broken for me. I won't go into too much detail, as I think those guys are providing a good service and I'm not helping them out myself, but suffice it to say that the trouble it gives me with headers has been the least of my problems.

I'd like to know of both free and paid versions.

I'm on Twitter
I've finally started using my twitter account. I started the account a while back, but never really "got it." I guess the other day the light bulb went off in my head. It's like email + IRC + instant messaging + blogging all in one.

Anyway, if you're on twitter and want to start following me, I'll get notified and probably start following you as well. Of course, if I start getting too many updates, I'll randomly stop following some people. Try not to take it personally if that happens.

I try not to give the minute details of my life like "I just woke up" or "I'm voting for so-and-so." Instead, I've been trying to stay on the topic of this blog (programming and technology), but with small thoughts about whatever I happen to be working on. Of course, you'll find that some responses to other people won't always be on my main topic.

I hope to see you on there. And if you can help in the C++ IDE department, please let me know!




Posted by Sam on Dec 05, 2008 at 12:00 AM UTC - 5 hrs
Shitty variable names, unnecessary usage of pointer arithmetic, and clever tricks are WTFs.

When I saw this WTF code that formats numbers to add commas, I realized I had forgotten how low-level C really is, even after using C++ fairly regularly for the last several months.

With 500+ upvotes, the people over at the reddit thread regarding the WTF apparently think the real WTF is the post, not the code.
nice_code, stupid_submitter - in which TheDailyWTF jumps the shark by ridiculing perfectly good code.
Let's forgive the misuse of the worn-out phrase and get to whether or not looking at the code should result in the utterance of WTFs.


It goes something like this (a rough C sketch of these steps follows the list):
  1. Setup, in which we use globally shared memory
  2. Negate the number if it is negative, for no reason I can think of, and set a flag to remember that it was.
  3. Set buffer to point to the final spot. Move it back one, dereference it, and insert the null string terminator there.
  4. Move the buffer backwards. Dereference the pointer and set the value there to the character '0'. Add the remainder of the number divided by ten to that, since '1' is 1 away from '0', and so forth.
  5. Divide the number by 10 so we can get to the next digit.
  6. If we've done the loop 3 times, move the buffer pointer back one location and insert a comma.
  7. Repeat starting at step 4 until the number is 0.
  8. Cleanup - if the front character is a comma, remove it by moving the buffer pointer forward.
  9. Cleanup - Move the buffer pointer backwards and insert our '-' if the negative flag has been set.
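
To make the steps concrete, here's a rough reconstruction of that approach in C. This is my sketch built from the steps above, not the original nice_num code; the names and the buffer size are made up, and using saner names than the original is part of what makes it easier to follow.

static char shared_buf[32];                     /* step 1: globally shared memory */

char *nice_num_sketch(long num)
{
    int negative = 0;
    if (num < 0) { num = -num; negative = 1; }  /* step 2: remember the sign */

    char *buffer = shared_buf + sizeof(shared_buf);
    *--buffer = '\0';                           /* step 3: terminator at the end */

    int count = 0;
    do {
        *--buffer = (char)('0' + num % 10);     /* step 4: digit via ASCII offset */
        num /= 10;                              /* step 5: next digit */
        if (++count == 3) {                     /* step 6: comma every 3 digits */
            *--buffer = ',';
            count = 0;
        }
    } while (num != 0);                         /* step 7 */

    if (*buffer == ',') buffer++;               /* step 8: drop a leading comma */
    if (negative) *--buffer = '-';              /* step 9: prepend the sign */
    return buffer;
}

Calling nice_num_sketch(-1234567) returns a pointer into shared_buf at "-1,234,567". Even cleaned up, you still have to keep the pointer movement and the ASCII arithmetic in your head to follow it.
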
I felt like that required too much brain power to follow for doing something so simple. So I decided to make my own version. I thought I'd try the following:
  1. Copy the number to a string.
  2. Starting at the end, copy each character to another string, inserting a comma every 3rd time.
A thousand times simpler than the convoluted mess of nice_num. Here's that attempt, in C:

#include <stdio.h>
#include <stdlib.h>

int num_digits(long num);   /* defined elsewhere in the repository */

char* commadify(long num, char* result)
{
    int input_len = num_digits(num);
    /* +1 for the '\0' that sprintf appends */
    char* input_number = malloc( (input_len + 1) * sizeof(char) );
    sprintf(input_number, "%ld", num);

    int number_of_commas = (input_len - 1) / 3;
    int result_len = input_len + number_of_commas;

    int input_index = input_len - 1;
    int result_index, count = 0;
    for (result_index = result_len - 1; result_index >= 0; result_index--) {
        if (count == 3) {
            result[result_index] = ',';
            result_index--;
            count = 0;
        }
        result[result_index] = input_number[input_index];
        input_index--;
        count++;
    }
    result[result_len] = '\0';   /* make sure the result is terminated */
    free(input_number);
    return result;
}

I think it's clearer - but not by much. Certainly the variable names are better, because you don't have to first understand what's going on to know what they are for. I think moving the pointer arithmetic back into the more-familiar array notation helps understandability. And removing the trick of knowing that the ASCII code for '1' is 1 more than the code for '0', and so on up to '9' being 9 more, means less thinking.

Commadify is 4 times slower than nice_num

On the negative side, the commadify code is slower than nice_num, mostly because it uses malloc instead of preallocated memory. Removing the malloc and free calls and using preallocated memory instead shaves a couple of tenths of a second off the one-million-runs loop. But you have to compensate with more code that keeps track of the start position.
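
For what it's worth, here's roughly what that variant might look like. This is my sketch, assuming "preallocated memory" simply means a fixed scratch buffer rather than whatever the repository actually does; num_digits and the includes are the same as above.

char* commadify_noalloc(long num, char* result)
{
    char input_number[32];                /* fixed scratch buffer: no malloc, no free */
    int input_len = num_digits(num);
    sprintf(input_number, "%ld", num);

    int number_of_commas = (input_len - 1) / 3;
    int result_len = input_len + number_of_commas;

    int input_index = input_len - 1;
    int result_index, count = 0;
    for (result_index = result_len - 1; result_index >= 0; result_index--) {
        if (count == 3) {
            result[result_index--] = ',';
            count = 0;
        }
        result[result_index] = input_number[input_index--];
        count++;
    }
    result[result_len] = '\0';
    return result;
}

The start-position bookkeeping presumably comes into play if you go further and build the result from the end of a single preallocated buffer, nice_num-style, rather than computing the indices up front.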

So what's the verdict? I don't think we're reacting to the WTFs I mentioned above when we see the nice_num code. I think we're reacting to C itself. We're so used to very high level languages that common tricks and things you'd know as a C programmer are the WTFs to us.

This kind of stuff isn't outside the realm of what a strong programmer should know. It isn't even close to the border. The truth is our low-level skills are out of practice and we should probably get some.

What do you think?

Code with main program and comparison with nice_num is available at my github repository, miscellany/commadify.

Update: Thanks to Dave Kirby's comment, I've fixed memory leaks and updated the code in this post and at the repository. The link to the repo originally pointed to a specific commit, so I've changed it to link to master instead.


Posted by Sam on Apr 22, 2009 at 12:00 AM UTC - 5 hrs
slarbi@nibbler> make
make: 'dwight_conrad' is up to date.

slarbi@nibbler> make anyway
make: *** No rule to make target 'anyway'. Stop.

slarbi@nibbler> make rule to make target anyway
make: *** No rule to make target 'rule'. Stop.

slarbi@nibbler> alias makeanyway='make -B' #ohthatmakesalotofsense
slarbi@nibbler> makeanyway
...
c++ main.o -o dwight_conrad -g

slarbi@nibbler> thank you
-bash: thank: command not found


Posted by Sam on Jan 14, 2008 at 06:42 AM UTC - 5 hrs
This is a story about my journey as a programmer, the major highs and lows I've had along the way, and how this post came to be. It's not about how ecstasy made me a better programmer, so I apologize if that's why you came.

In any case, we'll start at the end, jump to the beginning, and move along back to today. It's long, but I hope the read is as rewarding as the write.

A while back, Reg Braithwaite challenged programming bloggers with three posts he'd love to read (and one that he wouldn't). I loved the idea so much that I've been thinking about all my experiences as a programmer off and on for the last several months, trying to find the links between what I learned from certain languages that made me a better programmer in others, and how they made me better overall. That's how this post came to be.


The experiences discussed herein were valuable in their own right, but the challenge itself is rewarding as well. How often do we pause to reflect on what we've learned, and more importantly, how it has changed us? Because of that, I recommend you perform the exercise as well.

I freely admit that some of this isn't necessarily caused by my experiences with the language alone - but instead shaped by the languages and my experiences surrounding the times.

One last bit of administrata: Some of these memories are over a decade old, and therefore may bleed together and/or be unfactual. Please forgive the minor errors due to memory loss.


How QBASIC Made Me A Programmer

As I've said before, from the time I was very young, I had an interest in making games. I was enamored with my Atari 2600, and then later the NES. I also enjoyed a playground game with Donald Duck and Spelunker.

Before I was 10, I had a notepad with designs for my as-yet-unreleased blockbuster of a side-scrolling game that would run on my very own Super Sanola game console (I had the shell designed, not the electronics).

It was that intense interest in how to make a game that led me to inspect some of the source code Microsoft provided with QBASIC. After learning PRINT, INPUT, IF..THEN, and GOTO (and of course SomeLabel: to go to) I was ready to take a shot at my first text-based adventure game.

The game wasn't all that big - consisting of a few rooms, the NEWS (north, east, west, south) directions, swinging of a sword against a few monsters, and keeping track of treasure and stats for everything - but it was a complete mess.

The experience with QBASIC taught me that, for any given program of sufficient complexity, you really only need three to four language constructs:
  1. Input
  2. Output
  3. Conditional statements
  4. Control structures
Even the control structures may not be strictly necessary. Why? Suppose you know a set of operations will be performed some unknown, arbitrary number of times. Suppose also that it will be performed fewer than X times, where X is a known, finite quantity. Then you can simply write out X conditionals to cover all the cases. Not efficient, but not a requirement either.
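
As a toy illustration of that argument (my example, in C rather than QBASIC): printing up to the first three numbers with a loop, versus with nothing but conditionals.

#include <stdio.h>

/* With a control structure: print i for 0 <= i < n, capped at 3. */
void with_loop(int n)
{
    for (int i = 0; i < n && i < 3; i++)
        printf("%d\n", i);
}

/* Without one: since we know the bound (3), write out one
   conditional per possible iteration instead. */
void with_conditionals_only(int n)
{
    if (n > 0) printf("0\n");
    if (n > 1) printf("1\n");
    if (n > 2) printf("2\n");
}

Not something you'd want to maintain, but it makes the point: with a known bound, conditionals alone will do.
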

Unfortunately, that experience and its lesson stuck with me for a while. (Hence, the title of this weblog.)

Side Note: The number of language constructs I mentioned that are necessary is not from a scientific source - just from the top of my head at the time I wrote it. If I'm wrong on the amount (be it too high or too low), I always appreciate corrections in the comments.


What ANSI Art taught me about programming

When I started making ANSI art, I was unaware of TheDraw. Instead, I opened up those .ans files I enjoyed looking at so much in MS-DOS Editor to see how it was done. A bunch of escape codes and blocks came together to produce a thing of visual beauty.

[Image: ANSI art picture of a face]

Since all I knew about were the escape codes and the blocks (alt-177, 178, 219-223 mostly), naturally I used the MS-DOS Editor to create my own art. The limitations of the medium were strangling, but that was what made it fun.

And I'm sure you can imagine the pain - worse than programming in an assembly language (at least for relatively small programs). Nevertheless, the experience taught me some valuable lessons:
  • Even though we value people over tools, don't underestimate the value of a good tool. In fact, when attempting anything new to you, see if there's a tool that can help you. Back then, I was on local BBSs, and not the 1337 ones when I first started out. Now, the Internet is ubiquitous. We don't have an excuse anymore.

  • I can now navigate through really bad code (and code that is limited by the language) a bit easier than I might otherwise have been able to do. I might have to do some experimenting to see what the symbols mean, but I imagine everyone would. And to be fair, I'm sure years of personally producing such crapcode also has something to do with my navigation abilities.

  • Perhaps most importantly, it taught me the value of working in small chunks and taking baby steps. When you can't see the result of what you're doing, you've got to constantly check the results of the latest change, and most software systems are like that. Moreover, when you encounter something unexpected, an effective approach is to isolate the problem by isolating the code. In doing so, you can reproduce the problem and problem area, making the fix much easier.

The Middle Years (included for completeness' sake)

The middle years included exposure to Turbo Pascal, MASM, C, and C++, and some small experiences in other places as well. Although I learned many lessons, there are far too many to list here, and most are so small as to not be significant on their own. Therefore, they are uninteresting for the purposes of this post.

However, there were two lessons I learned from this time (but not during) that are significant:
  1. Learn to compile your own $&*@%# programs (or, learn to fish instead of asking for them).
  2. Stop being an arrogant know-it-all prick and admit you know nothing.
As you can tell, I was quite the cowboy coding young buck. I've tried to change that in recent years.


How ColdFusion made me a better programmer when I use Java

Although I've written a ton of bad code in ColdFusion, I've also written a couple of good lines here and there. I came into ColdFusion with the experiences I've related above, and my early work with it definitely illustrates that fact. I cared nothing for small files, knew nothing of abstraction, and created horrendous god-files as a result.

If you're a fan of Italian food, looking through my code would make your mouth water.

DRY principle? Forget about it. I still thought code reuse meant copy and paste.

Still, ColdFusion taught me one important lesson that got me started on the path to Object Oriented Enlightenment: database access shouldn't require several lines of boilerplate code to execute one line of SQL.

Because of my experience with ColdFusion, I wrote my first reusable class in Java that took the boilerplating away, let me instantiate a single object, and use it for queries.


How Java taught me to write better programs in Ruby, C#, CF and others

It was around the time I started using Java quite a bit that I discovered Uncle Bob's Principles of OOD, so much of the improvement here is only indirectly related to Java.

Sure, I had heard about object oriented programming, but either I shrugged it off ("who needs that?") or didn't "get" it (or more likely, a combination of both).

Whatever it was, it took a couple of years of revisiting my own crapcode in ColdFusion and Java as a "professional" to tip me over the edge. I had to find a better way: Grad school here I come!

The better way was to find a new career. I was going to enter as a Political Scientist and drop programming altogether. I had seemingly lost all passion for the subject.

Fortunately for me now, the political science department wasn't accepting Spring entrance, so I decided to at least get started in computer science. Even more luckily, that first semester Venkat introduced me to the solution to many of my problems, and got me excited about programming again.

I was using Java fairly heavily during all this time, so learning the principles behind OO in depth and in Java allowed me to extrapolate that for use in other languages. I focused on principles, not recipes.

On top of it all, Java taught me about unit testing with JUnit. Now, the first thing I look for when evaluating a language is a unit testing framework.


What Ruby taught me that the others didn't

My experience with Ruby over the last year or so has been invaluable. In particular, there are four lessons I've taken (or am in the process of taking):
  • The importance of code as data, or higher-order functions, or first-class functions, or blocks or closures: After learning how to appropriately use yield, I really miss it when I'm using a language where it's lacking.

  • There is value in viewing programming as the construction of languages, and DSLs are useful tools to have in your toolbox.

  • Metaprogramming is OK. Before Ruby, I used metaprogramming very sparingly. Part of that is because I didn't understand it, and the other part is I didn't take the time to understand it because I had heard how slow it can make your programs.

    Needless to say, after seeing it in action in Ruby, I started using those features more extensively everywhere else. After seeing Rails, I very rarely write queries in ColdFusion - instead, I've got a component that takes care of it for me.

  • Because of my interests in Java and Ruby, I've recently started browsing JRuby's source code and issue tracker. I'm not yet able to put into words what I'm learning, but that time will come with some more experience. In any case, I can't imagine that I'll learn nothing from the likes of Charlie Nutter, Ola Bini, Thomas Enebo, and others. Can you?

What's next?

Missing from my experience has been a functional language. Sure, I had a tiny bit of Lisp in college, but not enough to say I got anything out of it. So this year, I'm going to do something useful and not useful in Erlang. Perhaps next I'll go for Lisp. We'll see where time takes me after that.

That's been my journey. What's yours been like?

Now that I've written that post, I have a request for a post I'd like to see:

What have you learned from a non-programming-related discipline that's made you a better programmer?

Last modified on Jan 16, 2008 at 07:09 AM UTC - 5 hrs

Posted by Sam on Sep 25, 2007 at 06:39 AM UTC - 5 hrs
The last bit of advice from Chad Fowler's 52 ways to save your job was to be a generalist, so this week's version is the obvious opposite: to be a specialist.

The intersection point between the two seemingly disparate pieces of advice is that you shouldn't use your lack of experience in multiple technologies to call yourself a specialist in another. Just because you develop in Java to the exclusion of .NET (or anything else) doesn't make you a Java specialist. To call yourself that, you need to be "the authority" on all things Java.


Chad mentions a measure he used to assess a job candidate's depth of knowledge in Java: a question of how to make the JVM crash.

I'm definitely lacking in this regard. I've got a pretty good handle on Java, Ruby, and ColdFusion. I've done a small amount of work in .NET and have been adding to that recently. I can certainly write a program that will crash - but can I write one to crash the virtual machine (or CLR)?

I can reluctantly write small programs in C/C++, but I'm unlikely to have the patience to trace through a large program for fun. I might even still be able to figure out some assembly language if you gave me enough time. Certainly in these lower-level languages it's not hard to find a way to crash. It's probably harder to avoid it, in fact.
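
For instance, in C (a throwaway example of my own, not from Chad's book), a crash is one line away:

#include <stddef.h>

int main(void)
{
    int *p = NULL;
    *p = 42;        /* dereferencing a null pointer: undefined behavior,
                       and in practice an almost-certain segfault */
    return 0;
}

Crashing your own process is the easy part, though. The interesting part of Chad's question is knowing enough about the runtime's internals to take the JVM or CLR itself down.
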

In ColdFusion, I've crashed the CF Server by simply writing recursive templates (those that cfinclude themselves). (However, I don't know if that still works.) In Java and .NET, I wouldn't know where to start. What about crashing a browser with JavaScript?

So Chad mentions that you should know the internals of the JVM and CLR. I should know how JavaScript works in the browser and not just how to getElementById(). With that in mind, these things are going on the to-learn list - the goal being to find a way to crash each of them.

Ideas?

Last modified on Sep 25, 2007 at 06:41 AM UTC - 5 hrs

Posted by Sam on Sep 02, 2007 at 03:48 PM UTC - 5 hrs
Bioinformatics is one area of computing where you'll still want to pay special attention to performance. With the human genome consisting of 3 billion bases, using one byte per base gives you three gigabytes of data to work with. Clearly, even something that gives you only a constant-factor improvement can result in huge savings of time and space.
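
As a concrete illustration of a constant-factor win (my own example, not something from the course): packing each base into 2 bits instead of a byte stores four bases per byte, turning 3 GB of sequence into roughly 750 MB.

#include <stddef.h>

/* Map a base to a 2-bit code. Anything unexpected is lumped in with 'T'
   for simplicity; real code would handle N's and errors properly. */
static unsigned char encode_base(char b)
{
    switch (b) {
        case 'A': return 0;
        case 'C': return 1;
        case 'G': return 2;
        default:  return 3;
    }
}

/* Pack n bases from seq into out; out must hold at least (n + 3) / 4 bytes. */
void pack_bases(const char *seq, size_t n, unsigned char *out)
{
    for (size_t i = 0; i < n; i++) {
        if (i % 4 == 0)
            out[i / 4] = 0;                               /* start a fresh byte */
        out[i / 4] |= encode_base(seq[i]) << (2 * (i % 4));
    }
}

Nothing about the algorithm changes - it's still one linear scan - but the constant factor is a quarter of the memory, and correspondingly less data to drag through the cache.
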

Because of that concern for performance, I expect to be working in C++ regularly this semester. In fact, the first day of class was a nice review of it, and I welcome the change since it's been many years since I've done much of anything in the language.


One thing that struck me as particularly painful was memory management and pointers. When was the last time you had to remember to delete [] p;? The power of being able to do such low-level manipulation may be inebriating, but you'd better not get too drunk. How else would you be able to keep the entire program in your head? (Paul Graham's timing was amazing, as I saw he posted that article about 10 minutes before my re-introduction to C++.)

C++ works against that goal on so many levels, particularly with the indirection pointers provide. Something like this simple program is relatively easy to understand and remember:

#include <iostream>
#include <cstdlib>

using namespace std;

int main(int argc, char *argv[])
{
    int *i = new int(1);
    *i = 1;
    cout << *i;
    delete i;   // memory from new (not new[]) is released with delete, not delete[]

    system("PAUSE");
    return EXIT_SUCCESS;
}

It is easy to see that i is a pointer to a location in heap memory that's holding data to be interpreted as an integer. To set or get that value you need to dereference the pointer, using the unary * operator.

But what happens when you increase the complexity a little? Here we'll take a reference to a pointer to int.

int printn(int *&n)
{
    cout << *n;
}

The idea stays the same, and is still relatively simple. But you can tell it is starting to get harder to decide what's going on. This program sets a variable and prints it. Can you imagine working with pointers to pointers or just a couple of hundred lines of this? Three cheers for the people that do.

What if we change it a bit?

int printn(int *n)
{
    cout << *n;
}

Are we passing a pointer by value, an int by reference, or is something else going on?

It makes me wonder how many times people try adding or removing a * when trying to fix broken code, as opposed to actually tracing through it and understanding what is going on. I recall doing a lot of that as an undergrad.

I'm not convinced mapping everything out would have been quicker. (I'm not convinced throwing asterisks around like hira shuriken was either.) One thing is for sure though - getting back into C++ will make my head hurt, probably more than trying to understand the real bioinformatics subject matter.

Last modified on Sep 02, 2007 at 03:49 PM UTC - 5 hrs









